A self-organising network that grows when required
Authors
Abstract
The ability to grow extra nodes is a potentially useful facility for a self-organising neural network. A network that can add nodes into its map space can approximate the input space more accurately, and often more parsimoniously, than a network with predefined structure and size, such as the Self-Organising Map. In addition, a growing network can deal with dynamic input distributions. Most of the growing networks that have been proposed in the literature add new nodes to support the node that has accumulated the highest error during previous iterations, or to support topological structures. This usually means that new nodes are added only when the number of iterations is an integer multiple of some pre-defined constant, A. This paper suggests a way in which the learning algorithm can add nodes whenever the network in its current state does not sufficiently match the input. In this way the network grows very quickly when new data are presented, but stops growing once the network has matched the data. This is particularly important for dynamic data sets, where the distribution of inputs can change to a new regime after some time. We also demonstrate that the network preserves the neighbourhood relations in the data. The new network is compared to an existing growing network, the Growing Neural Gas (GNG), on an artificial dataset, showing how the network deals with a change in input distribution after some time. Finally, the new network is applied to several novelty detection tasks: it is compared with both the GNG and an unsupervised form of the Reduced Coulomb Energy network on a robotic inspection task, and with a Support Vector Machine on two benchmark novelty detection tasks.
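The growth criterion described in the abstract (insert a node whenever the current network does not sufficiently match the input, rather than on a fixed iteration schedule) can be sketched as follows. This is a minimal illustration only, not the paper's exact algorithm: the activity function, threshold value, learning rate, and node-insertion rule are all assumptions chosen for simplicity.

```python
import numpy as np

def grow_when_required(data, activity_thresh=0.8, max_nodes=100, eps=0.1):
    """Minimal sketch of a grow-when-required scheme (illustrative only).

    A new node is inserted whenever the best-matching node's activity
    (a decaying function of its distance to the input) falls below a
    threshold, i.e. whenever the current network does not match the
    input well enough. Otherwise the winner is adapted towards the input.
    """
    nodes = data[:2].copy()  # start with two nodes, as growing networks typically do
    for x in data:
        dists = np.linalg.norm(nodes - x, axis=1)
        best = np.argmin(dists)
        activity = np.exp(-dists[best])  # a close match gives activity near 1
        if activity < activity_thresh and len(nodes) < max_nodes:
            # Network does not match this input: insert a node halfway
            # between the input and the best-matching node.
            new_node = (nodes[best] + x) / 2.0
            nodes = np.vstack([nodes, new_node])
        else:
            # Match is good enough: adapt the winner towards the input.
            nodes[best] += eps * (x - nodes[best])
    return nodes
```

Because insertion is triggered by a poor match rather than by an iteration counter, a network like this grows rapidly when inputs from a new region of the space first appear, and stops growing once the existing nodes cover the data, which is the behaviour the paper highlights for dynamic input distributions.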
Related articles
: a Dynamic Incremental Network That Learns by Discrimination
An incremental learning algorithm for a special class of self-organising, dynamic networks is presented. Learning is effected by adapting both the function performed by the nodes and the overall network topology, so that the network grows (or shrinks) over time to fit the problem. Convergence is guaranteed on any arbitrary Boolean dataset and empirical generalisation results demonstrate promise.
Unsupervised multimodal processing
We present two separate algorithms for unsupervised multimodal processing. Our first proposal, the single-pass Hebbian-linked self-organising map network, significantly reduces the training of Hebbian-linked self-organising maps by computing in a single epoch the weights of the links associating the separate modal maps. Our second proposal, based on the counterpropagation network algorithm, imple...
A self-organising view of manufacturing enterprises
Enterprises serve a purpose that is largely the reason for, as well as the result of, their existence in a form most amenable to their sustenance. Despite this, enterprises are seldom designed or operated in a way that brings out the best of their capabilities. The structure, normally designed by an external agency and with some instant of a future in mind, is quite ...
Generalisation and discrimination emerge from a self-organising componential network: a speech example
It is demonstrated that a componential code emerges when a self-organising neural network is exposed to continuous speech. The code’s components correspond to substructures that occur relatively independently of one another: words and phones. A capability for generalisation and discrimination develops without having been optimised explicitly. The componential structure is revealed by optimising...
Dynamically Reconfigurable Online Self-organising Fuzzy Neural Network with Variable Number of Inputs for Smart Home Application
A self-organising fuzzy-neural network (SOFNN) adapts its structure based on variations of the input data. Conventionally in such self-organising networks, the number of inputs providing the data is fixed. In this paper, we consider the situation where the number of inputs to a network changes dynamically during its online operation. We extend our existing work on a SOFNN such that the SOFNN ca...
Journal: Neural Networks (the official journal of the International Neural Network Society)
Volume 15, Issues 8-9
Pages: -
Publication date: 2002